Web Survey Bibliography
Relevance & Research Question: Open-ended questions are often used to collect short numeric information in self-administered web questionnaires. Respondents are asked to enter numbers, quantities, or frequencies into input fields, usually without computerized formatting constraints, mainly in order to prevent item nonresponse. The absence of formatting restrictions, however, produces a large variety of answers deviating from the desired format, including value ranges, estimates, alphanumeric supplements, or even different measuring units, which reduces data quality and increases the effort required for data cleaning and preparation. Concise and clear formatting instructions are therefore needed to guide respondents toward answers in the desired format. Since such instructions are easily ignored, the question arises how different modes of verbal instructions and visual cues can be applied to strengthen the impact of formatting instructions and, ultimately, to enhance data quality.
Methods & Data: In a between-subjects field experiment conducted among university freshman students in an opt-in panel (N=670), we tested different visual modes of formatting instructions in open-ended numeric questions: (1) conventional instruction in a static manner, (2) dynamic instruction in a tooltip appearing when the mouse cursor hovers over the input field, and (3) symbolic instruction in terms of pre-defined default values in the input field indicating the desired response format. The effectiveness of each instruction mode was determined by the proportion of formally correct answers.
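The outcome measure described above, the proportion of formally correct answers, can be sketched as a simple format check. The following Python snippet is a minimal illustration only: it assumes the desired format is a plain whole number, and the example responses are invented to mirror the deviation types named in the abstract (ranges, estimates, appended units), not data from the study.

```python
import re

# Assumed desired format: a plain non-negative integer (e.g., "12"),
# with no ranges, units, text supplements, or estimates.
CORRECT_FORMAT = re.compile(r"^\s*\d+\s*$")

def is_formally_correct(answer: str) -> bool:
    """Return True if the answer matches the assumed numeric format."""
    return bool(CORRECT_FORMAT.match(answer))

def proportion_correct(answers: list[str]) -> float:
    """Share of formally correct answers, the measure used to
    compare the effectiveness of the instruction modes."""
    if not answers:
        return 0.0
    return sum(is_formally_correct(a) for a in answers) / len(answers)

# Hypothetical responses illustrating typical format deviations:
responses = [
    "12",        # correct
    "10-15",     # value range
    "about 20",  # estimate with alphanumeric supplement
    "3 hours",   # measuring unit appended
    "7",         # correct
]
```

For the list above, `proportion_correct(responses)` yields 0.4, since only two of the five answers match the assumed format.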
Results: Findings indicated that implementing dynamic formatting elements, whether tooltips or default values, did not improve response quality compared to conventional static formatting instructions. Even a combination of tooltips and pre-filled symbols did not yield a significant increase in correctly formatted answers over the sole presentation of a static instruction.
Added Value: The results indicated that static formatting instructions should not be hastily replaced before the effects of dynamic elements have been examined sufficiently. At the same time, initial findings pointed to the potential of dynamic formatting instructions to reinforce the positive effect of conventional instructions.
Web survey bibliography - 2012 (371)
- Investigating the Impact of the Number of Grid Items on Web Survey Responses; 2012; Guo, F., Nunge, E.
- Effects of Progress Indicators on Short Questionnaires; 2012; Sedley, A., Callegaro, M.
- The JanDY Online Survey System; 2012; Kamstra, J., Henley, M., Johnson, M.
- The Detection and Effects of Data From Potentially Ineligible Participants in Online Survey Research...; 2012; Grey, J.
- Guidelines for developing a robust web survey; 2012; Naithani, P.
- Internet Mobility Survey Sampling Biases in Measuring Frequency of Use of Different Transport Modes; 2012; Diana, M.
- Bringing data from an online survey or spreadsheet into SPSS; 2012; Raftery, D.
- Using e-surveys to access the views of football fans within online communities; 2012; Gibbons, T., Nuttall, D.
- Coverage error in internet surveys: Can fixed phones fix it?; 2012; Vicente, P., Reis, E.
- Exploring the impact of animation-based questionnaire on conducting a web-based educational survey and...; 2012; Chien, Y.-T., Chang, C.-Y.
- Marktforschung mit dem iPad-Panel von Axel Springer Media Impact; 2012
- Online and Paper-Based: A Mixed-Method Approach to Conducting a Needs Assessment Survey of Physicians...; 2012; Olatunbosun, T., Wu, C., Grewal, G., Lynn, B.
- The “frequency divide”: implications for internet-based surveys; 2012; Vicente, P., Reis, E.
- Effects of Personalized Versus Generic Implementation of an Intra-Organizational Online Survey on Psychological...; 2012; Mueller, K., Straatmann, T., Hattrup, K., Jochum, M.
- Web Surveys: Methodological Problems and Research Perspectives; 2012; Biffignandi, S., Bethlehem, J.
- Increasing Response Rate in Web-Based/Internet Surveys; 2012; Manzo, A. N., Burke, J. M.
- Open-ended Questions in Web Surveys: One Large vs. Ten Small Boxes; 2012; Keusch, F.
- Effects of Pagination on Short Online Surveys; 2012; Sedley, A., Callegaro, M.
- A Systematic Review of Studies Investigating the Quality of Data Obtained with Online Panels; 2012; Callegaro, M., Villar, A., Krosnick, J. A., Yeager, D. S.
- Exploring New Pathways to Survey Recruitment; 2012; Bilgram, V., Stadler, D., Jawecki, G.
- Understanding selection bias in a worldwide, volunteer web-survey; 2012; Tijdens, K., Steinmetz, S.
- Does Mode Matter? Initial Evidence from the German Longitudinal Election Study (GLES); 2012; Blumenstiel, J. E., Rossmann, J.
- The Representativity of Web Surveys of the General Population compared to Traditional Modes and Mixed...; 2012; Klausch, L. T., Schouten, B., Hox, J.
- Surveytainment 2.0: Why investing 10 more minutes more in constructing your questionnaire is worth considering...; 2012; Muehle, A., Tress, F., Schmidt, S., Winkler, T.
- Market research online community (MROC) versus focus group; 2012; Zuber, M.
- Data quality in MAWI and CAWI; 2012; Mavletova, A. M., Blasius, J.
- Time use data collection using Smartphones: Results of a pilot study among experienced and inexperienced...; 2012; Scherpenzeel, A., Sonck, N., Fernee, H., Morren, Me.
- Scrutinizing Dynamics – Rolling panel waves in theory and practice; 2012; Faas, T., Blumenberg, J. N.
- Do Mail and Internet Surveys Produce Different Item Nonresponse Rates? An Experiment Using Random Mode...; 2012; Millar, M. M., Dillman, D. A.
- Item Nonresponse in a Client Survey of the General Public; 2012; Israel, G. D., Lamm, A. J.
- Comparing Item Nonresponse across Different Delivery Modes in General Population Surveys; 2012; Lesser, V. M., Newton, L., Yang, D.
- Determinants of Item Nonresponse to Web and Mail Respondents in Three Address-Based Mixed-Mode Surveys...; 2012; Messer, B. L., Edwards, M. L., Dillman, D. A.
- Little experience with technology as a cause of nonresponse in online surveys; 2012; Struminskaya, B., Schaurer, I., Kaczmirek, L., Bandilla, W.
- Automatic Forwarding on Web Surveys – Some Outlines and Remarks; 2012; Selkaelae, A.
- Thinking, Planning & Operationalizing Empirical Mixed Methods Research Design; 2012; Ruhi, U.
- Continuous large-scale volunteer web-surveys: The experience of Lohnspiegel and WageIndicator; 2012; Oez, F.
- Is Pretesting Established Among Online Survey Tool Users?; 2012
- An Evaluation of Two Non-Reactive Web Questionnaire Pretesting Methods; 2012; Lenzner, T.
- Recommendations for implementing online surveys and simple experiments in social and behavioural research...; 2012; Hewson, C. M.
- High potential for mobile Web surveys: Findings from a survey representative for German Internet users...; 2012; Funke, F., Wachenfeld, A.
- A taxonomy of paradata for web surveys and computer assisted self interviewing (Casi); 2012; Callegaro, M.
- Can Social Media Research replace traditional research methods?; 2012; Faber, T., Einhorn, M., Hofmann, O., Loeffler, M.
- Bad Boy Matrix Question – Whatcha gonna do when they come for you?; 2012; Tress, F.
- Matrix vs. Single Question Formats in Web Surveys: Results from a large scale experiment; 2012; Klausch, L. T., de Leeuw, E. D., Hox, J., de Jongh, A., Roberts , A.
- Effects of Static versus Dynamic Formatting Instructions for Open-Ended Numerical Questions in Web Surveys...; 2012; Kunz, T., Fuchs, M.
- FamilyVote – Conducting online surveys with children and families; 2012; Geissler, H., Peeters, H.
- The influence of social desirability on data quality in face-to-face and web surveys; 2012; Keusch, F.
- Reducing the Threat of Sensitive Questions in Online Surveys; 2012; Couper, M. P.
- Using the Internet to Administer More Realistic Vignette Experiments; 2012; Caro, F. G., Ho, T., McFadden, D., Gottlieb, A. S., Yee, C., Chan, T.
- Succinct Survey Measures of Web-Use Skills; 2012; Hargittai, E., Hsieh, Y. P.